
    Towards Intelligent Playful Environments for Animals based on Natural User Interfaces

    The study of animals' interactions with technology and the development of animal-centered technological systems has been gaining attention since the emergence of the research area of Animal Computer Interaction (ACI). ACI aims to improve animals' welfare and wellbeing in several scenarios by developing technology suitable for the animal, following an animal-centered approach. Among the research lines ACI is exploring, there has been significant interest in animals' playful interactions with technology. Technologically mediated playful activities have the potential to provide mental and physical stimulation for animals in different environmental contexts, which could in turn help to improve their wellbeing.

    As we embark on the era of the Internet of Things, current technological playful activities for animals have not yet explored pervasive solutions that could better adapt to the animals' preferences while offering more varied technological stimuli. Instead, playful technology for animals is usually based on digital interactions rather than exploring tangible devices or augmenting the interactions with other kinds of stimuli. In addition, these playful activities are predefined and do not change over time, and they require a human to provide the device or technology to the animal. If humans could focus on their participation as active players in an interactive system aimed at animals, instead of being concerned with holding a device for the animal or keeping the system running, this might help to create stronger bonds between species and foster better relationships with animals. Moreover, animals' mental and physical stimulation are important aspects that could be fostered if the playful systems designed for them offered a varied range of outputs, were tailored to the animal's behaviors, and prevented the animal from getting used to the system and losing interest. Therefore, this thesis proposes the design and development of technological playful environments based on Natural User Interfaces that can adapt and react to the animals' natural interactions. These pervasive scenarios would allow animals to play by themselves or with a human, providing more engaging and dynamic playful activities that are capable of adapting over time.

    Pons Tomás, P. (2018). Towards Intelligent Playful Environments for Animals based on Natural User Interfaces [Unpublished doctoral thesis, by compendium]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/113075
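The adaptive behavior the thesis argues for (varying stimuli and reacting to the animal's interactions so that it does not habituate and lose interest) can be illustrated with a minimal, hypothetical sketch. The class, thresholds, and stimulus names below are illustrative assumptions, not part of the thesis itself:

```python
import random

class PlayfulEnvironment:
    """Hypothetical sketch of an adaptive playful loop: the environment
    tracks whether the animal keeps interacting and rotates to a different
    stimulus when engagement drops, to delay habituation."""

    def __init__(self, stimuli, boredom_threshold=3):
        self.stimuli = list(stimuli)           # e.g. ["moving light", "sound", "rolling ball"]
        self.active = self.stimuli[0]          # currently offered stimulus
        self.misses = 0                        # consecutive sensing intervals with no interaction
        self.boredom_threshold = boredom_threshold

    def observe(self, animal_interacted: bool):
        """Update the engagement estimate from one sensing interval."""
        if animal_interacted:
            self.misses = 0
        else:
            self.misses += 1
            if self.misses >= self.boredom_threshold:
                self.switch_stimulus()

    def switch_stimulus(self):
        """Offer a different stimulus to try to re-engage the animal."""
        options = [s for s in self.stimuli if s != self.active]
        self.active = random.choice(options)
        self.misses = 0
```

A real system of this kind would drive `observe` from sensor input (depth cameras, pressure mats) rather than a boolean flag; the sketch only shows the adaptation policy.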

    Personalización de comportamiento en ambientes inteligentes empleando una herramienta para superficies interactivas

    [EN] Smart environments are a promising future that will bring improvements to our society, as the number of intelligent devices capable of taking part in those environments is increasing very rapidly. While the combination of diverse and heterogeneous devices within the same environment opens a wide range of new opportunities to make our lives easier, developing systems that adapt perfectly to any context or user becomes a difficult task, and even more so given the dynamic nature of the environments and their dwellers' preferences. For this reason, allowing end users to define the behavior of their own smart environments stands out as a very interesting feature. However, these users are non-experts and usually have no programming knowledge, so easy mechanisms must be provided for such a demanding exercise. This work presents a purely visual rule editor based on dataflow expressions for interactive tabletops which allows behavior to be specified in smart environments. An experiment was carried out to evaluate the usability of the editor in terms of non-programmers' understanding of the abstractions and concepts involved in the rule model, the ease of use of the proposed visual interface, and the suitability of the multi-touch and tangible interaction mechanisms implemented in the editing tool. The study revealed that users with no previous programming experience were able to master the proposed rule model and editing tool for specifying behavior in the context of a smart home. Moreover, the study revealed minor usability issues that will be solved in future versions of the editing tool, and several lines of future work have emerged for making smart environment customization even more natural.

    Pons Tomás, P. (2013). Personalización de comportamiento en ambientes inteligentes empleando una herramienta para superficies interactivas. http://hdl.handle.net/10251/37596

    DaFRULE: un modelo de reglas enriquecido mediante flujos de datos para la definición visual de comportamientos en entornos reactivos

    [EN] Defining behavior rules for reactive environments is a complex task for non-expert users or those without prior programming knowledge. The tools available to date for easing this task are either not very flexible or expressive in the rules they allow to be defined, or they are expressive enough but too complex for non-programmers to understand. Given the need to provide mechanisms that ease the task of editing behavior rules, an expressive and generic rule language applicable to several domains has been developed. An editor that uses this language has been built, as well as a rule processing engine that allows rules defined with this language to be employed in the simulation of reactive environments. Experiments have been carried out to validate the correctness of the proposed language and the scalability of the rule processing engine.

    Pons Tomás, P. (2012). DaFRULE: un modelo de reglas enriquecido mediante flujos de datos para la definición visual de comportamientos en entornos reactivos. http://hdl.handle.net/10251/16539
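The general idea behind such dataflow-based behavior rules (sensor values flowing through a condition into an action) can be sketched minimally. The names below (`Source`, `make_rule`, the smart-home scenario) are illustrative and are not DaFRULE syntax:

```python
class Source:
    """A named data source in the flow, e.g. a temperature sensor."""
    def __init__(self, name):
        self.name = name
        self.value = None

    def emit(self, value):
        """Push a new reading into the flow."""
        self.value = value

def make_rule(source, predicate, action):
    """Wire a condition over a data flow to an action: the rule fires
    only when the source has a value and the predicate holds."""
    def evaluate():
        if source.value is not None and predicate(source.value):
            return action(source.value)
        return None
    return evaluate

# Example rule: "when the living-room temperature exceeds 26 degrees, start the fan".
temperature = Source("livingroom.temperature")
fan_rule = make_rule(temperature,
                     predicate=lambda t: t > 26,
                     action=lambda t: f"fan.on (temp={t})")
```

In a visual editor of this kind, the source, predicate, and action would each be a draggable block and the wiring between them would be the dataflow expression; the sketch only shows the underlying evaluation model.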

    Interactive spaces for children: gesture elicitation for controlling ground mini-robots

    [EN] Interactive spaces for education are emerging as a mechanism for fostering children's natural ways of learning through play and exploration in physical spaces. The advanced interactive modalities and devices for such environments need to be both motivating and intuitive for children. Among the wide variety of interactive mechanisms, robots have been a popular research topic in the context of educational tools due to their attractiveness for children. However, few studies have focused on how children would naturally interact with and explore interactive environments with robots. While there is abundant research on full-body interaction and intuitive manipulation of robots by adults, no similar research has been done with children. This paper therefore describes a gesture elicitation study that identified the preferred gestures and body language used by children to control ground robots. The results of the elicitation study were used to define a gestural language that covers the different gesture preferences by age group and gender, with a good acceptance rate in the 6-12 age range. The study also revealed that interactive spaces with robots controlled through body gestures are motivating and promising scenarios for collaborative or remote learning activities.

    This work is funded by the European Development Regional Fund (EDRF-FEDER) and supported by the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks are due to the children and teachers of the Col·legi Públic Vicente Gaos for their valuable collaboration and dedication.

    Pons Tomás, P.; Jaén Martínez, FJ. (2020). Interactive spaces for children: gesture elicitation for controlling ground mini-robots. Journal of Ambient Intelligence and Humanized Computing. 11(6):2467-2488.
    https://doi.org/10.1007/s12652-019-01290-6

    Envisioning Future Playful Interactive Environments for Animals

    Full text link
    The final publication is available at Springer via http://dx.doi.org/10.1007/978-981-287-546-4_6

    Play stands as one of the most natural and inherent behaviors among the majority of living species, humans and animals alike. Human play has evolved significantly over the years, and so have the artifacts that allow us to play: from children playing tag with no tools other than their own bodies, to modern video games that use haptic and wearable devices to augment the playful experience. However, this ludic revolution has not reached humans' closest companions, our pets. Recently, a new discipline within the human–computer interaction (HCI) community, called animal–computer interaction (ACI), has focused its attention on improving animals' welfare using technology. Several works in the ACI field rely on playful interfaces to mediate digital communication between animals and humans. Until now, each of these interfaces has addressed only a single goal or activity, and adapting it to the animals' needs requires the developers' intervention. This work analyzes the existing approaches and proposes a more generic and autonomous kind of system, aimed at addressing several aspects of animal welfare at a time: Intelligent Playful Environments for Animals. The great potential of these systems is discussed, explaining how incorporating intelligent capabilities within playful environments could allow the system to learn from the animals' behavior and automatically adapt the game to their needs and preferences. The engaging playful activities created with these systems could serve different purposes and eventually improve animals' quality of life.

    This work was partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the projects CreateWorlds (TIN2010-20488) and SUPEREMOS (TIN2014-60077-R), and from the Universitat Politècnica de València under Project UPV-FE-2014-24. It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons has been supported by the Universitat Politècnica de València under the "Beca de Excelencia" program and currently by an FPU fellowship from the Spanish Ministry of Education, Culture, and Sports (FPU13/03831).

    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Envisioning Future Playful Interactive Environments for Animals. In: More Playful User Interfaces: Interfaces that Invite Social and Physical Interaction. Springer, pp. 121–150. https://doi.org/10.1007/978-981-287-546-4_6
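The adaptive behavior this abstract envisions can be illustrated with a small, hypothetical sketch (the stimulus names and weighting factors are invented, not taken from the chapter): a playful environment keeps a preference score per stimulus, reinforces stimuli the animal engages with, decays the ones it ignores, and samples the next stimulus accordingly so the game changes over time.

```python
import random
from collections import defaultdict

class AdaptivePlayfulEnvironment:
    """Picks the next play stimulus, favoring those the animal engaged with."""

    def __init__(self, stimuli):
        self.stimuli = list(stimuli)
        self.scores = defaultdict(lambda: 1.0)  # optimistic start: try everything

    def choose_stimulus(self):
        # Weighted random choice: preferred stimuli appear more often,
        # but less-used ones are still explored occasionally.
        weights = [self.scores[s] for s in self.stimuli]
        return random.choices(self.stimuli, weights=weights, k=1)[0]

    def record_reaction(self, stimulus, engaged):
        # Reinforce stimuli the animal reacted to; decay ignored ones.
        self.scores[stimulus] *= 1.5 if engaged else 0.7

# Hypothetical session: the cat chases the laser dot but ignores the sound cue.
env = AdaptivePlayfulEnvironment(["laser_dot", "moving_ball", "sound_cue"])
env.record_reaction("laser_dot", engaged=True)
env.record_reaction("sound_cue", engaged=False)
print(env.choose_stimulus())
```

The multiplicative update is only one possible design choice; it keeps every stimulus selectable while letting observed preferences dominate, which matches the abstract's goal of avoiding habituation without developer intervention.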

    Beyond the limits of digital interaction: should animals play with interactive environments?

    Full text link
    Our digital world evolves towards ubiquitous and intuitive scenarios, filled with interconnected and transparent computing devices that ease our daily activities. We have approached this evolution of technology in a strictly human-centric manner. There are, however, plenty of species, among them our pets, that could also profit from these technological advances. A new field in Computer Science, called Animal-Computer Interaction (ACI), aims at filling this technological gap by developing systems and interfaces specifically designed for animals. This paper envisions how ACI could be extended to enhance the most natural animal behavior: play. This work explains how interactive environments could become playful scenarios where animals enjoy, learn and interact with technology, improving their wellbeing.

    This work is partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the project CreateWorlds (TIN2010-20488). The work of Patricia Pons is supported by an FPU fellowship from the Spanish Ministry of Education, Culture and Sports (FPU13/03831). It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). We also thank the Valencian Society for the Protection of Animals and Plants (SVPAP) for their cooperation.

    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Beyond the limits of digital interaction: should animals play with interactive environments?. ACM. http://hdl.handle.net/10251/65361

    Developing a depth-based tracking system for interactive playful environments with animals

    Full text link
    © ACM 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (p. 59). http://dx.doi.org/10.1145/2832932.2837007

    [EN] Digital games for animals within Animal Computer Interaction are usually single-device oriented; however, richer interactions could be delivered by considering multimodal environments and expanding the number of technological elements involved. In these playful ecosystems, animals could be either alone or accompanied by human beings, but in both cases the system should react properly to the interactions of all the players, creating more engaging and natural games. Technologically mediated playful scenarios for animals will therefore require contextual information about the game participants, such as their location or body posture, in order to suitably adapt the system reactions. This paper presents a depth-based tracking system for cats capable of detecting their location, body posture and field of view. The proposed system could also be extended to locate and detect human gestures and track small robots, becoming a promising component in the creation of intelligent interspecies playful environments.

    Work supported by the Spanish Ministry of Economy and Competitiveness and funded by the EDRF-FEDER (TIN2014-60077-R). The work of Patricia Pons has been supported by a national grant from the Spanish MECD (FPU13/03831). Alejandro Catalá also received support from a VALi+d fellowship from the GVA (APOSTD/2013/013). Special thanks to our cat participants, their owners, and our feline caretakers and therapists.

    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Developing a depth-based tracking system for interactive playful environments with animals. ACM. https://doi.org/10.1145/2832932.2837007
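As a rough illustration of this kind of depth-based tracking (a minimal sketch under invented thresholds and frame values, not the authors' implementation), an overhead depth camera yields a height map once the known floor depth is subtracted; the subject's image-plane location and a crude posture estimate can then be read off the foreground pixels:

```python
import numpy as np

def track_subject(depth_frame, floor_depth_mm, min_height_mm=40):
    """Locate a subject in a top-view depth frame and guess its posture.

    depth_frame: 2D array of distances (mm) from an overhead depth camera.
    Pixels sufficiently closer to the camera than the floor are foreground.
    """
    height_map = floor_depth_mm - depth_frame          # height above the floor
    mask = height_map > min_height_mm                  # foreground pixels
    if not mask.any():
        return None                                    # nothing in view
    ys, xs = np.nonzero(mask)
    centroid = (float(xs.mean()), float(ys.mean()))    # image-plane location
    peak_height = float(height_map[mask].max())
    # Crude posture guess from the subject's highest point (threshold invented).
    posture = "standing" if peak_height > 200 else "crouching"
    return {"centroid": centroid, "peak_height_mm": peak_height, "posture": posture}

# A toy 6x6 frame: floor at 2000 mm, a 250 mm-tall blob in the upper-left area.
frame = np.full((6, 6), 2000.0)
frame[1:3, 1:3] = 1750.0
print(track_subject(frame, floor_depth_mm=2000.0))
```

A real pipeline would additionally segment multiple subjects, smooth over time, and map image coordinates to room coordinates via the camera calibration; this sketch only shows the background-subtraction core.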

    Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking

    Full text link
    [EN] There is growing interest in the automatic detection of animals' behaviors and body postures within the field of Animal Computer Interaction, and in the benefits this could bring to animal welfare: enabling remote communication, welfare assessment, detection of behavioral patterns, and interactive and adaptive systems. Most works on animal behavior recognition rely on wearable sensors to gather information about the animals' postures and movements, which is then processed using machine learning techniques. However, non-wearable mechanisms such as depth-based tracking could also make use of machine learning techniques and classifiers for the automatic detection of animals' behavior. These systems also offer the advantage of working in set-ups in which wearable devices would be difficult to use. This paper presents a depth-based tracking system for the automatic detection of animals' postures and body parts, as well as an exhaustive evaluation of the performance of several classification algorithms based on both a supervised and a knowledge-based approach. The evaluation of the depth-based tracking system and the different classifiers shows that the proposed system is promising for advancing research on animal behavior recognition within and outside the field of Animal Computer Interaction. © 2017 Elsevier Ltd. All rights reserved.

    This work is funded by the European Development Regional Fund (EDRF-FEDER) and supported by the Spanish MINECO with Project TIN2014-60077-R. It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks to our cat participants and their owners, and many thanks to our feline caretakers and therapists, Olga, Asier and Julia, for their valuable collaboration and their dedication to animal wellbeing.

    Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2017). Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking. Expert Systems with Applications. 86:235-246. https://doi.org/10.1016/j.eswa.2017.05.063
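To illustrate the supervised side of such an evaluation without assuming any particular ML library, the sketch below implements a tiny k-nearest-neighbours classifier over invented posture features (body height and body length in mm); the paper itself evaluated several classifiers on real tracking data, which this toy baseline does not reproduce:

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Classify each test sample by majority vote among its k nearest
    training samples (Euclidean distance in feature space)."""
    preds = []
    for x in test_X:
        dists = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(dists)[:k]]
        values, counts = np.unique(nearest, return_counts=True)
        preds.append(values[np.argmax(counts)])
    return np.array(preds)

# Invented 2-feature samples: (body height mm, body length mm) per posture.
rng = np.random.default_rng(0)
standing = rng.normal([250, 300], 10, size=(20, 2))
lying = rng.normal([80, 450], 10, size=(20, 2))
X = np.vstack([standing, lying])
y = np.array(["standing"] * 20 + ["lying"] * 20)

test_X = np.array([[245, 310], [85, 440]])
print(knn_predict(X, y, test_X))
```

In practice the evaluation would use cross-validation and per-class metrics rather than two hand-picked test points; the sketch only shows the feature-vector-to-label step that any of the compared classifiers performs.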

    Customizing smart environments: a tabletop approach

    Full text link
    Smart environments are becoming a reality in our society and the number of intelligent devices integrated in these spaces is increasing very rapidly. As the combination of intelligent elements will open a wide range of new opportunities to make our lives easier, final users should be provided with a simplified method of handling complex intelligent features. Specifying behavior in these environments can be difficult for non-experts, so more effort should be directed towards easing the customization tasks. This work presents an entirely visual rule editor for interactive tabletops, based on dataflow expressions, which allows behavior to be specified in smart environments. An experiment was carried out to evaluate the usability of the editor in terms of non-programmers' understanding of the abstractions and concepts involved in the rule model, the ease of use of the proposed visual interface, and the suitability of the interaction mechanisms implemented in the editing tool. The study revealed that users with no previous programming experience were able to master the proposed rule model and editing tool for specifying behavior in the context of a smart home, even though some minor usability issues were detected.

    We would like to thank all the volunteers who participated in the empirical study. Our thanks are also due to the ASIC/Polimedia team for their computer hardware support. This work was partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the project CreateWorlds (TIN2010-20488). It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons has been supported by the Universitat Politècnica de València under the "Beca de Excelencia" program, and currently by an FPU fellowship from the Spanish Ministry of Education, Culture and Sports (FPU13/03831).

    Pons Tomás, P.; Catalá Bolós, A.; Jaén Martínez, FJ. (2015). Customizing smart environments: a tabletop approach. Journal of Ambient Intelligence and Smart Environments. 7(4):511-533. https://doi.org/10.3233/AIS-150328
tangible input for tabletop displays in acquisition and manipulation tasks, in: Proc. of the SIGCHI Conference on Human Factors in Computing Systems (CHI’10), ACM, New York, NY, USA, 2010, pp. 2223–2232.[46]A. Uribarren, J. Parra, R. Iglesias, J.P. Uribe and D. López de Ipiña, A middleware platform for application configuration, adaptation and interoperability, in: Proc. of the 2008 Second IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops, IEEE Computer Society, Washington, DC, USA, 2008, pp. 162–167.Weiser, M. (1991). The Computer for the 21st Century. Scientific American, 265(3), 94-104. doi:10.1038/scientificamerican0991-94[48]C. Wohlin, P. Runeson, M. Höst, M.C. Ohlsson, B. Regnell and A. Wesslén, Experimentation in Software Engineering: An Introduction, 1st edn, Kluwer Academic Publishers, Norwell, MA, USA, 2000.Zuckerman, O., & Gal-Oz, A. (2013). To TUI or not to TUI: Evaluating performance and preference in tangible vs. graphical user interfaces. International Journal of Human-Computer Studies, 71(7-8), 803-820. doi:10.1016/j.ijhcs.2013.04.00

    Animal Ludens: Building Intelligent Playful Environments for Animals

    Full text link
    © ACM (2014). This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in {Source Publication}, http://dx.doi.org/10.1145/2693787.2693794. Looking for effective ways to understand how animals interact with computer-mediated systems, Animal-Computer Interaction (ACI) research should rely on the most natural and intrinsic behavior among the majority of living species: play. Animals are naturally motivated to play. Playful environments are, therefore, a promising scenario in which to start developing animal-centered ecosystems, and there are plenty of circumstances where playful environments could help to improve animals' well-being. However, developing a custom system for each possible context remains unfeasible, and more appealing solutions are required. If playful environments were equipped with intelligent capabilities, they could learn from the animals' behavior and automatically adapt themselves to the animals' needs and preferences by creating engaging playful activities for different purposes. Hence, this work will define intelligent playful environments for animals and explain how Ambient Intelligence (AmI) can contribute to creating adaptable playful experiences for animals in order to improve their quality of life. This work was partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the project CreateWorlds (TIN2010-20488). It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons has been supported by the Universitat Politècnica de València under the "Beca de Excelencia" program, and currently by an FPU fellowship from the Spanish Ministry of Education, Culture and Sports (FPU13/03831). Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A.
(2014). Animal Ludens: Building Intelligent Playful Environments for Animals. ACM. https://doi.org/10.1145/2693787.2693794
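    The adaptive loop the abstract envisions, in which an environment learns from the animal's behavior and adjusts its stimuli to sustain engagement, could be sketched roughly as follows. This is a minimal, hypothetical illustration: the class, stimulus names, and scoring scheme are all assumptions, not taken from the paper.

```python
# Hypothetical sketch of an intelligent playful environment: it observes how
# long the animal engages with each stimulus, keeps a recency-weighted score,
# and mostly offers the best-scoring stimulus while occasionally exploring
# others to avoid habituation. All names here are illustrative.
import random
from collections import defaultdict

class PlayfulEnvironment:
    def __init__(self, stimuli):
        self.stimuli = list(stimuli)          # e.g. light, sound, moving toy
        self.engagement = defaultdict(float)  # running score per stimulus

    def record_interaction(self, stimulus, duration_s):
        # Exponential moving average: recent behavior weighs most, so the
        # environment gradually "forgets" stimuli the animal has tired of.
        self.engagement[stimulus] = (
            0.8 * self.engagement[stimulus] + 0.2 * duration_s
        )

    def next_stimulus(self, explore_prob=0.2):
        # Mostly exploit the currently preferred stimulus, but sometimes
        # pick at random so new preferences can still be discovered.
        if random.random() < explore_prob or not self.engagement:
            return random.choice(self.stimuli)
        return max(self.stimuli, key=lambda s: self.engagement[s])

env = PlayfulEnvironment(["laser_dot", "squeak_sound", "rolling_ball"])
env.record_interaction("rolling_ball", 30)   # chased the ball for 30 s
env.record_interaction("squeak_sound", 2)    # barely reacted to the sound
print(env.next_stimulus(explore_prob=0.0))   # → rolling_ball
```

    In an AmI setting the `record_interaction` events would come from sensors (cameras, pressure pads, wearables) rather than manual calls, but the adapt-and-explore structure would be the same.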